Learning Gaussian Process Kernels via Hierarchical Bayes
Authors
Abstract
We present a novel method for learning with Gaussian process regression in a hierarchical Bayesian framework. In the first step, kernel matrices on a fixed set of input points are learned from data using a simple and efficient EM algorithm. This step is nonparametric, in that it does not require a parametric form of covariance function. In the second step, kernel functions are fitted to approximate the learned covariance matrix using a generalized Nyström method, which results in a complex, data-driven kernel. We evaluate our approach as a recommendation engine for art images, where the proposed hierarchical Bayesian method leads to excellent prediction performance.
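The two steps above can be sketched in code. This is a minimal illustration, not the paper's exact algorithm: the EM update assumes the simple hierarchical model f_j ~ N(0, K) with i.i.d. Gaussian noise on each task's observations, and the extension step uses an assumed RBF base kernel for cross-covariances with new inputs; all function names and parameter values are hypothetical.

```python
import numpy as np

def rbf(A, B, ell=1.0):
    # Squared-exponential base kernel; the choice of base kernel and
    # length scale `ell` are assumptions for this sketch.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

def learn_kernel_matrix_em(Y, noise_var=0.1, n_iter=50):
    # EM for a free-form kernel matrix K on fixed inputs, under the
    # assumed model f_j ~ N(0, K), y_j = f_j + N(0, noise_var * I).
    # Y holds one task per row, shape (m_tasks, n_points).
    m, n = Y.shape
    K = np.cov(Y.T) + 1e-3 * np.eye(n)   # initialise from sample covariance
    for _ in range(n_iter):
        # E-step: Gaussian posterior over each task's latent function
        S = np.linalg.inv(np.linalg.inv(K) + np.eye(n) / noise_var)
        Mu = (Y / noise_var) @ S          # posterior means, one per row
        # M-step: K is the average posterior second moment E[f f^T]
        K = S + Mu.T @ Mu / m
    return K

def extend_kernel(K_learned, X_train, ell=1.0, jitter=1e-8):
    # Nystrom-style extension of the learned matrix to a kernel function:
    # k(x, y) = k0(x, X) K0^{-1} K_learned K0^{-1} k0(X, y),
    # which reproduces K_learned exactly on the training inputs.
    n = X_train.shape[0]
    K0 = rbf(X_train, X_train, ell) + jitter * np.eye(n)
    M = np.linalg.solve(K0, np.linalg.solve(K0, K_learned).T)
    def k(Xa, Xb):
        return rbf(Xa, X_train, ell) @ M @ rbf(X_train, Xb, ell)
    return k
```

The extension step is one standard way to turn a finite learned covariance matrix into a full covariance function; by construction it agrees with the learned matrix on the training points and interpolates smoothly elsewhere through the base kernel.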
Similar Resources
A Multitask Point Process Predictive Model
Point process data are commonly observed in fields like healthcare and the social sciences. Designing predictive models for such event streams is an under-explored problem, due to often scarce training data. In this work we propose a multitask point process model, leveraging information from all tasks via a hierarchical Gaussian process (GP). Nonparametric learning functions implemented by a GP...
A Practical View of Suboptimal Bayesian Classification with Radial Gaussian Kernels
For pattern classification in a multi-dimensional space, the minimum misclassification rate is obtained by using the Bayes criterion. Kernel estimators, or probabilistic neural networks, provide a good way to evaluate the probability densities of each class of data and are an interesting parallel implementation of the Bayesian classifier. However, their training procedure leads to a very high number of ...
Learning curves for multi-task Gaussian process regression
We study the average case performance of multi-task Gaussian process (GP) regression as captured in the learning curve, i.e. the average Bayes error for a chosen task versus the total number of examples n for all tasks. For GP covariances that are the product of an input-dependent covariance function and a free-form intertask covariance matrix, we show that accurate approximations for the learn...
Bayes Optimal Hyperplanes → Maximal Margin Hyperplanes
Maximal margin classifiers are a core technology in modern machine learning. They have strong theoretical justifications and have shown empirical successes. We provide an alternative justification for maximal margin hyperplane classifiers by relating them to Bayes optimal classifiers that use Parzen window estimates with Gaussian kernels. For any value of the smoothing parameter (the width o...
Varying-coefficient models with isotropic Gaussian process priors
We study learning problems in which the conditional distribution of the output given the input varies as a function of additional task variables. In varying-coefficient models with Gaussian process priors, a Gaussian process generates the functional relationship between the task variables and the parameters of this conditional. Varying-coefficient models subsume multitask models—such as hierarc...
Journal:
Volume/Issue:
Pages: -
Publication date: 2004